Univariate hyperbolic tangent neural network approximation

Authors

Abstract


Similar articles

A Neural Network Implementation of Hyperbolic Tangent Function Using Approximation Method

Neural networks are mainly used in applications that demand fast computation, and the nonlinear activation function is one of their main building blocks; the hyperbolic tangent sigmoid function is widely used for this purpose. The approximation is based on a mathematical analysis that treats the maximum allowable error as the design parameter and is used to derive the optimal number of input and output bits requi...
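
A minimal sketch of this idea is given below, assuming a simple lookup-table scheme rather than the paper's exact method: tanh is tabulated for a chosen number of input and output bits, and the worst-case error is measured so the bit widths can be checked against a maximum allowable error. The input range [-4, 4), the 8-bit widths, and the function names are illustrative assumptions.

```python
# Sketch only: fixed-point lookup-table approximation of tanh, with the
# maximum error reported so bit widths can be chosen against an error budget.
import numpy as np

def build_tanh_lut(in_bits: int, out_bits: int, x_max: float = 4.0):
    """Quantize the input range into 2**in_bits entries and store tanh values
    rounded to out_bits fractional bits."""
    n_entries = 2 ** in_bits
    xs = np.linspace(-x_max, x_max, n_entries, endpoint=False)
    scale = 2 ** out_bits
    lut = np.round(np.tanh(xs) * scale) / scale
    return xs, lut

def lut_tanh(x, xs, lut, x_max: float = 4.0):
    """Evaluate the table by the nearest lower quantized input (saturating outside)."""
    step = 2 * x_max / len(xs)
    idx = np.clip(((x + x_max) / step).astype(int), 0, len(xs) - 1)
    return lut[idx]

if __name__ == "__main__":
    xs, lut = build_tanh_lut(in_bits=8, out_bits=8)
    grid = np.linspace(-4.0, 4.0, 100_001)
    err = np.max(np.abs(np.tanh(grid) - lut_tanh(grid, xs, lut)))
    print(f"max |tanh(x) - LUT(x)| on [-4, 4): {err:.5f}")
```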


Hopfield neural network: The hyperbolic tangent and the piecewise-linear activation functions

This paper reports two-dimensional parameter-space plots for both the hyperbolic tangent and the piecewise-linear neuron activation functions of a three-dimensional Hopfield neural network. The plots obtained with the two activation functions are compared, and we show that similar features are present in both. The occurrence of self-organized periodic structures embedded in chaotic region...
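
The sketch below is an illustrative three-neuron Hopfield-type simulation, not the paper's model or its parameter scan: it integrates dx/dt = -x + W·f(x) with both activation functions so their trajectories can be compared. The weight matrix, time step, and initial state are assumptions made for the example.

```python
# Sketch only: a 3-neuron Hopfield-type network run with tanh and with a
# piecewise-linear activation, using assumed (not the paper's) parameters.
import numpy as np

def tanh_act(x):
    return np.tanh(x)

def pwl_act(x):
    # Piecewise-linear counterpart of tanh: linear on [-1, 1], saturated outside.
    return np.clip(x, -1.0, 1.0)

def simulate(activation, W, x0, dt=0.01, steps=5000):
    """Euler integration of dx/dt = -x + W @ activation(x)."""
    x = np.array(x0, dtype=float)
    traj = np.empty((steps, len(x)))
    for k in range(steps):
        x = x + dt * (-x + W @ activation(x))
        traj[k] = x
    return traj

if __name__ == "__main__":
    # Assumed weight matrix; the paper scans weights as bifurcation parameters.
    W = np.array([[ 1.2, -1.6,  1.0],
                  [ 1.0,  1.0,  0.5],
                  [-1.8,  1.2,  0.8]])
    x0 = [0.1, -0.2, 0.3]
    for name, act in [("tanh", tanh_act), ("piecewise-linear", pwl_act)]:
        final = simulate(act, W, x0)[-1]
        print(f"{name:>17s} activation, final state: {np.round(final, 3)}")
```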


Universal Approximator Property of the Space of Hyperbolic Tangent Functions

In this paper, the space of hyperbolic tangent functions is first introduced, and then the universal approximator property of this space is proved. In fact, by using this space, any nonlinear continuous function can be uniformly approximated to any degree of accuracy. As an application, this space of functions is also utilized to design feedback control for a nonlinear dynamical system.
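
A minimal sketch of the approximation statement, under assumptions not taken from the paper (a fixed grid of shifts, a fixed slope, and a least-squares fit): a linear combination of tanh basis functions is fitted to a continuous target, and the uniform error on a grid is reported.

```python
# Sketch only: uniform approximation of a continuous function by a linear
# combination of tanh(slope * (x - c)) terms, fitted with least squares.
import numpy as np

def tanh_features(x, centers, slope=4.0):
    """Feature matrix whose columns are tanh(slope * (x - c)) for each center c."""
    return np.tanh(slope * (x[:, None] - centers[None, :]))

if __name__ == "__main__":
    target = lambda x: np.sin(2 * np.pi * x) + 0.3 * x**2   # illustrative target
    x_train = np.linspace(-1.0, 1.0, 400)
    centers = np.linspace(-1.0, 1.0, 25)                    # assumed tanh shifts

    Phi = tanh_features(x_train, centers)
    # Append a constant column and solve the linear least-squares problem.
    A = np.hstack([Phi, np.ones((len(x_train), 1))])
    coeffs, *_ = np.linalg.lstsq(A, target(x_train), rcond=None)

    approx = A @ coeffs
    print(f"uniform error on the grid: {np.max(np.abs(approx - target(x_train))):.4f}")
```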


Tangent Measure Distributions of Hyperbolic Cantor Sets

Tangent measure distributions were introduced by Bandt [2] and Graf [8] as a means to describe the local geometry of self-similar sets generated by iteration of contractive similitudes. In this paper we study the tangent measure distributions of hyperbolic Cantor sets generated by certain contractive mappings, which are not necessarily similitudes. We show that the tangent measure distributions o...
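
For concreteness, the sketch below generates a Cantor-type set by iterating two contractive maps; the classical middle-thirds similitudes are used here as an assumption, whereas the paper treats more general contractions that need not be similitudes.

```python
# Sketch only: intervals of a Cantor-type set after a few iterations of two
# contractive maps (here the middle-thirds similitudes, an assumed choice).
def iterate_cantor(level: int, maps=None):
    """Return the interval endpoints after `level` iterations of the maps."""
    if maps is None:
        maps = [lambda t: t / 3.0, lambda t: t / 3.0 + 2.0 / 3.0]
    intervals = [(0.0, 1.0)]
    for _ in range(level):
        intervals = [(f(a), f(b)) for (a, b) in intervals for f in maps]
    return sorted(intervals)

if __name__ == "__main__":
    for (a, b) in iterate_cantor(3):
        print(f"[{a:.4f}, {b:.4f}]")
```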


Approximation with Diffusion-Neural-Network

Neural information processing models largely assume that the samples for training a neural network are sufficient; otherwise, a non-negligible error exists between the real function and the function estimated from the trained network. To reduce this error, in this paper we suggest a diffusion-neural-network (DNN) to learn from a small sample. First, we show the principle of information diffusion us...
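
The sketch below illustrates the general idea only, under an assumed form of the diffusion step (Gaussian-perturbed copies of each sample), not the authors' DNN: a small sample is expanded into derived samples before a small network is fitted, with the aim of reducing sensitivity to sample scarcity.

```python
# Sketch only: a crude "information diffusion" step (assumed form) that expands
# a small sample before training a small regression network.
import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(0)

# A small original sample from an unknown target (illustrative).
x_small = np.linspace(-1.0, 1.0, 8)
y_small = np.sin(np.pi * x_small)

# Assumed diffusion step: each sample spawns Gaussian-perturbed input copies
# that share its output value.
h = 0.1                                   # diffusion width (assumed)
x_diff = np.repeat(x_small, 25) + rng.normal(0.0, h, 25 * len(x_small))
y_diff = np.repeat(y_small, 25)

net = MLPRegressor(hidden_layer_sizes=(20,), activation="tanh",
                   solver="lbfgs", max_iter=5000, random_state=0)
net.fit(x_diff.reshape(-1, 1), y_diff)

x_test = np.linspace(-1.0, 1.0, 200).reshape(-1, 1)
err = np.max(np.abs(net.predict(x_test) - np.sin(np.pi * x_test.ravel())))
print(f"max error of the net trained on diffused samples: {err:.3f}")
```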


Journal

Journal title: Mathematical and Computer Modelling

Year: 2011

ISSN: 0895-7177

DOI: 10.1016/j.mcm.2010.11.072